Stratified Structure of Laplacian Eigenmaps Embedding
Abstract
We construct a locality-preserving weight matrix for the Laplacian eigenmaps algorithm used in dimension reduction. Our point cloud data is sampled from a low-dimensional stratified space embedded in a higher-dimensional one. Specifically, we use tools developed in local homology and persistent homology for kernels and cokernels to infer a weight matrix that captures neighborhood relations among points in the same or different strata.
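For context, the standard Laplacian eigenmaps pipeline that the abstract's weight matrix plugs into can be sketched as follows. This is a minimal generic implementation (kNN graph with heat-kernel weights), not the stratification-aware construction the paper proposes; the function name, parameters, and kernel choice are illustrative assumptions.

```python
import numpy as np

def laplacian_eigenmaps(X, n_neighbors=5, n_components=2, sigma=1.0):
    """Minimal Laplacian eigenmaps sketch: build a kNN graph with
    heat-kernel weights, then embed with the bottom nontrivial
    eigenvectors of the (normalized) graph Laplacian."""
    n = X.shape[0]
    # Pairwise squared Euclidean distances.
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    # Heat-kernel weights on the k nearest neighbors of each point.
    W = np.zeros((n, n))
    for i in range(n):
        idx = np.argsort(d2[i])[1:n_neighbors + 1]  # skip the point itself
        W[i, idx] = np.exp(-d2[i, idx] / (2.0 * sigma ** 2))
    W = np.maximum(W, W.T)  # symmetrize the adjacency
    # Normalized Laplacian D^{-1/2} (D - W) D^{-1/2}.
    d = W.sum(axis=1)
    Dm = np.diag(1.0 / np.sqrt(d))
    L_sym = Dm @ (np.diag(d) - W) @ Dm
    vals, vecs = np.linalg.eigh(L_sym)
    # Drop the trivial constant eigenvector; map back via D^{-1/2}.
    return Dm @ vecs[:, 1:n_components + 1]
```

The paper's contribution is, in effect, a replacement for the generic weight matrix `W` above: one inferred from local and persistent homology so that points in different strata are weighted appropriately.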
Similar Resources
Minimally Redundant Laplacian Eigenmaps
Spectral algorithms for learning low-dimensional data manifolds have largely been supplanted by deep learning methods in recent years. One reason is that classic spectral manifold learning methods often learn collapsed embeddings that do not fill the embedding space. We show that this is a natural consequence of data where different latent dimensions have dramatically different scaling in obser...
Nonlinear Manifold Learning Part II 6.454 Summary
Manifold learning addresses the problem of finding low–dimensional structure within collections of high–dimensional data. Recent interest in this problem was motivated by the development of a pair of algorithms, locally linear embedding (LLE) [6] and isometric feature mapping (IsoMap) [8]. Both methods use local, linear relationships to derive global, nonlinear structure, although their specifi...
Supervised embedding of textual predictors with applications in clinical diagnostics for pediatric cardiology.
OBJECTIVE Electronic health records possess critical predictive information for machine-learning-based diagnostic aids. However, many traditional machine learning methods fail to simultaneously integrate textual data into the prediction process because of its high dimensionality. In this paper, we present a supervised method using Laplacian Eigenmaps to enable existing machine learning methods ...
A Note on Markov Normalized Magnetic Eigenmaps
We note that building a magnetic Laplacian from the Markov transition matrix, rather than the graph adjacency matrix, yields several benefits for the magnetic eigenmaps algorithm. The two largest benefits are that the embedding becomes more stable as a function of the rotation parameter g, and the principal eigenvector of the magnetic Laplacian now converges to the PageRank of the network as a...
Out-of-Sample Extensions for LLE, Isomap, MDS, Eigenmaps, and Spectral Clustering
Several unsupervised learning algorithms based on an eigendecomposition provide either an embedding or a clustering only for given training points, with no straightforward extension for out-of-sample examples short of recomputing eigenvectors. This paper provides a unified framework for extending Local Linear Embedding (LLE), Isomap, Laplacian Eigenmaps, Multi-Dimensional Scaling (for dimension...
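The unified framework this abstract refers to rests on the Nyström formula, which extends a kernel eigendecomposition computed on training points to a new point. A minimal sketch, assuming a Gaussian kernel (the function name and parameters here are illustrative, not from the cited paper):

```python
import numpy as np

def nystrom_extension(X_train, V, eigvals, x_new, sigma=1.0):
    """Embed an out-of-sample point x_new given training points X_train,
    the kernel matrix's top eigenvectors V (columns) and eigenvalues
    eigvals. Kernel choice (Gaussian) is an assumption for illustration.

    Nystrom formula: y_j(x) = (1 / lambda_j) * sum_i k(x, x_i) * v_j(x_i)
    """
    k = np.exp(-((X_train - x_new) ** 2).sum(axis=1) / (2.0 * sigma ** 2))
    return (k @ V) / eigvals
```

A sanity check on this formula: applying it to a training point recovers that point's row of the eigenvectors, since `K @ V = V * eigvals` by definition of the eigendecomposition.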